LSV-Based Tail Inequalities for Sums of Random Matrices
Authors
Abstract
Similar Resources
Dimension-free tail inequalities for sums of random matrices
We derive exponential tail inequalities for sums of random matrices with no dependence on the explicit matrix dimensions. These are similar to the matrix versions of the Chernoff bound and Bernstein inequality except with the explicit matrix dimensions replaced by a trace quantity that can be small even when the dimension is large or infinite. Some applications to principal component analysis a...
Tail inequalities for sums of random matrices that depend on the intrinsic dimension
This work provides exponential tail inequalities for sums of random matrices that depend only on intrinsic dimensions rather than explicit matrix dimensions. These tail inequalities are similar to the matrix versions of the Chernoff bound and Bernstein inequality except with the explicit matrix dimensions replaced by a trace quantity that can be small even when the explicit dimensions are large...
User-Friendly Tail Bounds for Sums of Random Matrices
This work presents probability inequalities for sums of independent, random, self-adjoint matrices. The results frame simple, easily verifiable hypotheses on the summands, and they yield strong conclusions about the large-deviation behavior of the maximum eigenvalue of the sum. Tail bounds for the norm of a sum of rectangular matrices follow as an immediate corollary, and similar techniques yiel...
Complete Convergence and Some Maximal Inequalities for Weighted Sums of Random Variables
For a sequence of arbitrary random variables and an array of real numbers, we obtain two maximal inequalities for partial sums and weighted sums of random variables, and we also prove complete convergence for the weighted sums under some conditions on the weights and the underlying sequence.
Moment inequalities for sums of random matrices and their applications in optimization
In this paper, we consider various moment inequalities for sums of random matrices, which are well-studied in the functional analysis and probability theory literature, and demonstrate how they can be used to obtain the best known performance guarantees for several problems in optimization. First, we show that the validity of a recent conjecture of Nemirovski is actually a direct consequence of t...
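The entries above all concern Bernstein-type tail bounds in which the ambient matrix dimension is replaced by an intrinsic (effective) dimension. As an illustration, here is one common formulation of an intrinsic-dimension matrix Bernstein inequality in the spirit of the works cited above; the exact constants and the admissible range of t vary from reference to reference, so this is a sketch rather than the statement of any single paper. For independent, mean-zero, self-adjoint random matrices $X_1, \dots, X_n$ with $\|X_i\| \le R$ almost surely, set

$$
V = \sum_{i=1}^{n} \mathbb{E}\,X_i^{2}, \qquad
\sigma^{2} = \|V\|, \qquad
\tilde{d} = \frac{\operatorname{tr}(V)}{\|V\|},
$$

and then, typically,

$$
\Pr\!\left[\lambda_{\max}\!\left(\sum_{i=1}^{n} X_i\right) \ge t\right]
\;\le\; 4\,\tilde{d}\,
\exp\!\left(\frac{-t^{2}/2}{\sigma^{2} + Rt/3}\right),
\qquad t \ge \sigma + \tfrac{R}{3}.
$$

The classical matrix Bernstein bound has the ambient dimension $d$ in place of $4\tilde{d}$; since $\tilde{d} \le d$ always, and $\tilde{d}$ can remain bounded while $d$ grows or is infinite, the intrinsic-dimension version is effectively dimension-free.

A minimal numerical sketch of this trace quantity, assuming NumPy is available (the diagonal matrix V below is a hypothetical example, not data from any of the papers listed):

import numpy as np

# Hypothetical covariance-like matrix with a rapidly decaying spectrum
# in a large ambient dimension.
d = 1000
eigenvalues = 1.0 / np.arange(1, d + 1) ** 2     # lambda_j = 1 / j^2
V = np.diag(eigenvalues)

sigma2 = np.linalg.norm(V, 2)            # spectral norm ||V|| (largest eigenvalue)
intrinsic_dim = np.trace(V) / sigma2     # tr(V) / ||V||, always <= d

print(f"ambient dimension d : {d}")
print(f"intrinsic dimension : {intrinsic_dim:.3f}")   # roughly pi^2 / 6, about 1.645

Here the intrinsic dimension stays below 2 while the ambient dimension is 1000, which is exactly the regime in which the dimension-free bounds summarized above improve on their explicit-dimension counterparts.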
Journal
Journal title: Neural Computation
Year: 2017
ISSN: 0899-7667, 1530-888X
DOI: 10.1162/neco_a_00901